
Neuromorphic Sensors in Robotics

I study the characteristics of event cameras as perception sensors for field robotics, as an alternative to conventional sensors (LiDARs, frame-based cameras).

Catching Fast Moving Objects with Events #

Humans (and other animals) are remarkably good at reacting quickly to visual information. In this short video, the pitcher catches a ball flying at 90 m/s! We tried to replicate some of these abilities using event cameras.

Baseball pitcher catching a ball.

Our system was capable of estimating the trajectory of the ball in mid-air and moving a robotic net to catch it. Everything happens quickly: we have only 300 ms to capture data, perform the calculations, and execute a one-shot motion. The system caught balls thrown at up to 14 m/s!
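The trajectory-estimation step can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes we already have noisy 3D ball positions with timestamps (the event-based detection and triangulation are omitted), and it fits a simple ballistic model p(t) = p0 + v0·t − ½g·t²·ẑ by least squares, which can then be evaluated at a future time to place the net.

```python
import numpy as np

G = 9.81  # gravity, m/s^2, acting along -z

def fit_ballistic(times, positions):
    """Least-squares fit of a ballistic trajectory to 3D ball detections.

    times:     (N,) array of timestamps in seconds
    positions: (N, 3) array of (x, y, z) positions in meters
    Returns the initial position p0 and velocity v0.
    """
    t = np.asarray(times, dtype=float)
    p = np.asarray(positions, dtype=float)
    # Remove the known gravity term from z so the model becomes linear in t.
    p_free = p.copy()
    p_free[:, 2] += 0.5 * G * t**2
    # Solve p_free(t) ~= p0 + v0 * t for all three axes at once.
    A = np.stack([np.ones_like(t), t], axis=1)          # (N, 2)
    coeffs, *_ = np.linalg.lstsq(A, p_free, rcond=None)  # (2, 3)
    return coeffs[0], coeffs[1]                          # p0, v0

def predict(p0, v0, t):
    """Position at time t under the fitted ballistic model."""
    pos = p0 + v0 * t
    pos = pos.copy()
    pos[2] -= 0.5 * G * t**2
    return pos
```

With detections spread over the first part of the flight, `predict` extrapolates the intercept point; the real system must do this fit and the net motion inside the 300 ms budget.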

Gif showing the catching sequence.
Catching sequence for throws in different directions. Video playback speed 1/8x.
Paper

High-altitude Orthomapping with Event Cameras #

Orthomapping is the process of generating a map by stitching together multiple aerial pictures. Traditionally performed with frame-based cameras, it is vulnerable to problems such as overexposed bright areas and dark shadows. Our goal was to explore how event cameras perform for this particular task, since their higher dynamic range makes them less susceptible to these problems.

Gif showing transition between RGB and events.
Event cameras are able to see detail where RGB cameras can’t, thanks to their high dynamic range!

We found that fusing events with RGB images helped improve the number of reconstructed pixels in an orthomosaic, particularly in challenging areas of the images.

Fusion representation using RGB and events.
Fusion representations between RGB and events.
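To make the fusion idea concrete, here is a minimal sketch (not the paper's actual fusion representation). It assumes events arrive as (x, y, t, polarity) tuples with polarity in {−1, +1}, accumulates them into a single frame, and blends that frame with a normalized grayscale image so event-derived detail can fill in over- or under-exposed RGB regions.

```python
import numpy as np

def event_frame(events, shape):
    """Accumulate signed event polarities into a single frame.

    events: (N, 4) array of (x, y, t, polarity), polarity in {-1, +1}
    shape:  (height, width) of the output frame
    """
    frame = np.zeros(shape, dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    # np.add.at handles repeated pixel coordinates correctly.
    np.add.at(frame, (y, x), events[:, 3])
    return frame

def fuse(rgb_gray, ev, alpha=0.5):
    """Blend a grayscale image in [0, 1] with a normalized event frame.

    Where the RGB image is saturated or dark, the event channel still
    carries edge detail, so the blend preserves structure there.
    """
    ev_norm = np.abs(ev)
    if ev_norm.max() > 0:
        ev_norm = ev_norm / ev_norm.max()
    return (1 - alpha) * rgb_gray + alpha * ev_norm
```

The actual fusion representations used in the project are more sophisticated; this only shows why combining the two modalities recovers pixels that RGB alone would lose.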

This project also helped us develop a platform for high-altitude event-camera experiments that we will be using soon!

Project website Paper

M3ED: High-resolution, Multi-modal, Multi-environment Event Dataset #

High-resolution event cameras enable significant improvements in mapping. This short clip shows the difference between MVSEC and this work, M3ED.

MVSEC dataset example.
MVSEC car day.
M3ED dataset example.
M3ED car urban day.

We developed a standard sensor package featuring two high-resolution event cameras, two grayscale global-shutter cameras, one global-shutter RGB camera, a temperature-compensated IMU, and a 3D LiDAR. This package was mounted on three different platforms: a UAV, a car, and a Spot robot. We collected sequences in indoor, urban, and forest environments.

Project website Paper